Simulated annealing for optimization of graphs and sequences

Authors

Abstract

Optimization of discrete structures aims at generating a new structure with a better property given an existing one, which is a fundamental problem in machine learning. Different from continuous optimization, realistic applications of discrete optimization (e.g., text generation) are very challenging due to complex and long-range constraints, including both syntax and semantics, in discrete structures. In this work, we present SAGS, a novel Simulated Annealing framework for Graph and Sequence optimization. The key idea is to integrate powerful neural networks into the metaheuristic of simulated annealing (SA) to restrict the search space. We start by defining a sophisticated objective function, involving the property of interest and pre-defined constraints (e.g., grammar validity). SAGS searches towards this objective by performing a sequence of local edits, where deep generative neural networks propose the editing content and thus can control the quality of editing. We evaluate SAGS on paraphrase generation and molecule generation, for sequence and graph optimization respectively. Extensive results show that our approach achieves state-of-the-art performance compared with existing methods in terms of both automatic and human evaluations. Further, SAGS also significantly outperforms all previous methods in molecule generation.
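The search procedure described in the abstract follows the standard simulated-annealing loop: propose a local edit of the current structure, score it against the objective, and accept worsening edits with a temperature-controlled probability. A minimal Python sketch under stated assumptions — a random single-character edit proposer and a toy string objective stand in for the paper's deep generative proposal networks; `propose`, `objective`, and the geometric cooling schedule here are illustrative, not the SAGS implementation:

```python
import math
import random

def simulated_annealing(x0, objective, propose, n_steps=1000,
                        t_init=1.0, cooling=0.995, seed=0):
    """Generic SA search (maximization): always accept improving edits,
    accept worsening edits with probability exp(delta / T)."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t_init
    for _ in range(n_steps):
        cand = propose(x, rng)       # local edit of the current state
        fc = objective(cand)
        delta = fc - fx              # positive delta means improvement
        if delta >= 0 or rng.random() < math.exp(delta / t):
            x, fx = cand, fc         # Metropolis acceptance
        if fx > fbest:
            best, fbest = x, fx      # track the best state seen so far
        t *= cooling                 # anneal the temperature
    return best, fbest

# Toy sequence task: maximize the count of 'a' characters at fixed
# length, with a random single-character replacement as the edit.
def propose(seq, rng):
    i = rng.randrange(len(seq))
    return seq[:i] + rng.choice("ab") + seq[i + 1:]

best, score = simulated_annealing("bbbbbbbb", lambda s: s.count("a"), propose)
```

In SAGS, the random `propose` is replaced by a learned generative model, which keeps proposals inside the space of valid, fluent structures and thereby restricts the SA search.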


Similar articles

Simulated Annealing for Convex Optimization

We apply the method known as simulated annealing to the following problem in convex optimization: minimize a linear function over an arbitrary convex set, where the convex set is specified only by a membership oracle. Using distributions from the Boltzmann-Gibbs family leads to an algorithm that needs only O∗( √ n) phases for instances in R. This gives an optimization algorithm that makes O∗(n4...


Optimization by simulated annealing.

There is a deep and useful connection between statistical mechanics (the behavior of systems with many degrees of freedom in thermal equilibrium at a finite temperature) and multivariate or combinatorial optimization (finding the minimum of a given function depending on many parameters). A detailed analogy with annealing in solids provides a framework for optimization of the properties of very ...


Simulated Annealing and Global Optimization

Nelder-Mead (when you don’t know ∇f ) and steepest descent/conjugate gradient (when you do). Both of these methods are based on attempting to generate a sequence of positions xk with monotonically decreasing f(xk) in the hopes that the xk → x∗, the global minimum for f . If f is a convex function (this happens surprisingly often), and has only one local minimum, these methods are exactly the ri...


Global optimization and simulated annealing

In this paper we are concerned with global optimization, which can be defined as the problem of finding points on a bounded subset of IRn in which some real valued function f assumes its optimal (maximal or minimal) value. We present a stochastic approach which is based on the simulated annealing algorithm. The approach closely follows the formulation of the simulated annealing algorithm as orig...




Journal

Journal title: Neurocomputing

Year: 2021

ISSN: 0925-2312, 1872-8286

DOI: https://doi.org/10.1016/j.neucom.2021.09.003